Patent abstract:
ABSTRACT
The present invention relates to a driver attentiveness detection device comprising at least one digital camera device (2) and a control unit (5). The camera device (2) is arranged to detect eye configurations of a vehicle driver (4). The control unit (5) is arranged to compare the detected eye configurations with previously stored models of eye configuration samples (13, 14, 15, 16) which are indicative of eyes that look inside and/or outside a predetermined field of view (6). The control unit (5) is further arranged to determine whether the detected eye configurations are looking inside or outside the predetermined field of view from said comparison, and is also arranged to indicate when the vehicle driver (4) has been determined to be looking outside the predetermined field of view (6) to a predetermined extent. The present invention also relates to a corresponding method. (Fig. 1)
Publication number: SE1530037A1
Application number: SE1530037
Filing date: 2015-03-25
Publication date: 2015-10-04
Inventor: Benny Nilsson
Applicant: Autoliv Dev
IPC main class:
Patent description:

TITLE
Driver attentiveness detection method and device

DESCRIPTION OF THE INVENTION
The present invention relates to a method for detecting decreased attentiveness of a vehicle driver, the method comprising the step of detecting eye configurations of the vehicle driver.
The present invention also relates to a vehicle driver attentiveness detection device comprising at least one digital camera device and a control unit, said camera device being arranged to detect eye configurations of a vehicle driver.
Detection of vehicle driver attentiveness is desirable since loss of, or in any way deteriorated, attentiveness impairs the ability of a vehicle driver to control the vehicle and to be aware of the surroundings. Examples of vehicles are motor vehicles, trains, aircraft and boats. It may also be desirable to detect attentiveness for operators of industrial equipment and the like. A problem regarding deteriorated attentiveness is that, generally, persons do not detect their own lack of attentiveness when it appears. It is thus difficult for a person to be aware of lack of attentiveness, and to take action for counteraction. Deteriorated attentiveness may be due to different factors such as distracting objects or gadgets as well as drowsiness.
Today, many devices and methods for detecting attentiveness of a vehicle driver are known, and in most cases one or more digital cameras capture images of a vehicle driver's head features and the position of the eyes in order to calculate a gaze angle, and to determine whether the gaze falls within a gaze window. If the calculated gaze angle indicates that the gaze falls outside the gaze window for one or several predetermined amounts of time, it is determined that the driver is inattentive, which results in an alarm and/or other security actions.
Present attentiveness detection systems use algorithms based on advanced generic gaze and head-tracking software. Such software creates models of the face which are used to calculate the head and gaze directions. For these models to work, they must track several points on the eyes, nose and mouth. If some of these points are covered or tracked incorrectly, the performance degrades rapidly, leaving present systems fairly unstable. An example of such a system is disclosed in EP 2298155.
There is thus a need for a device and a method for detecting vehicle driver attentiveness which is less complex and more robust than previously known equipment of this kind, and where the risk of false alerts or other types of malfunctions is reduced. It is the object of the present invention to provide such a device and method.
Said object is achieved by means of a method for detecting decreased attentiveness of a vehicle driver, the method comprising the step of detecting eye configurations of the vehicle driver.
The method further comprises the steps:
- Analyzing the detected eye configurations by comparing the detected eye configurations with previously stored models of eye configuration samples. The stored models of eye configuration samples are indicative of eyes that look inside and/or outside a predetermined field of view.
- Determining whether the detected eye configurations are looking inside the predetermined field of view or outside the predetermined field of view using said analysis.
- Indicating when the vehicle driver has been determined to be looking outside the predetermined field of view to a predetermined extent.
Said object is also achieved by means of a vehicle driver attentiveness detection device comprising at least one digital camera device and a control unit, said camera device being arranged to detect eye configurations of a vehicle driver. The control unit is arranged to compare the detected eye configurations with previously stored models of eye configuration samples. The stored models of eye configuration samples are indicative of eyes that look inside and/or outside a predetermined field of view. The control unit is further arranged to determine whether the detected eye configurations are looking inside the predetermined field of view or outside the predetermined field of view. The control unit is furthermore arranged to indicate when the vehicle driver has been determined to be looking outside the predetermined field of view to a predetermined extent.
According to an example, the indication of when the vehicle driver has been determined to be looking outside the predetermined field of view to a predetermined extent comprises the production of an output signal which is indicative of vehicle driver inattentiveness.
According to another example, the output signal is used for triggering an alarm and/or one or more vehicle safety systems.
According to another example, the predetermined field of view is in the form of a volume that extends in a vehicle forward running direction.
As an example, the volume may extend to an imaginary end surface, positioned at a certain distance from the driver, where the detected eye configurations are determined to be looking inside the predetermined field of view if they are determined to be looking at the imaginary end surface.
According to another example, so-called Haar features may be used for modeling eye configuration samples.
Other examples are disclosed in the dependent claims.
A number of advantages are obtained by means of the present invention. Mainly, a much less complicated device and method for detection of vehicle driver attentiveness by determining where a driver is looking is provided, without the need for calculating such things as gaze angles, head angles and models for facial features.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described more in detail with reference to the appended drawings, where:
Figure 1 shows a schematic partial cut-open side view of a vehicle with a driver;
Figure 2 shows a schematic side view of the vehicle with the driver and a predetermined volume corresponding to a predetermined field of view;
Figure 3 shows a schematic top view of the vehicle and the predetermined volume;
Figure 4 shows a schematic front view of the driver and the predetermined volume;
Figure 5a shows a schematic first example of a driver with a positive eye configuration sample;
Figure 5b shows a schematic second example of a driver with a positive eye configuration sample;
Figure 5c shows a schematic third example of a driver with a positive eye configuration sample;
Figure 6a shows a schematic first example of a driver with a negative eye configuration sample;
Figure 6b shows a schematic second example of a driver with a negative eye configuration sample;
Figure 6c shows a schematic third example of a driver with a negative eye configuration sample;
Figure 7 shows a flowchart for an example of an initialization phase and an online phase; and
Figure 8 shows a flowchart for a general method according to the present invention.
DETAILED DESCRIPTION
Figure 1 schematically shows a cut-open part of a vehicle 1 arranged to run on a road 12 in a direction D, where the vehicle 1 comprises a near infrared (NIR) digital camera device 2 positioned on the dashboard with two NIR light sources F such as flashes (only one NIR light source is shown in Figure 1; the other NIR light source is assumed to be positioned on the other side of the camera device). The digital camera device 2 is arranged for capturing images of the eyes 3 of a person 4 driving the vehicle 1, a driver, and transferring these images to a control unit 5. These images are used for determining whether the driver 4 is looking at the road ahead or not, which in turn is a measure of the driver's attentiveness.
With reference to Figures 1, 2, 3 and 4, a volume 6 is defined in front of the driver 4, where the volume 6 is limited by four imaginary border walls 7, 8, 9, 10 that run from the driver 4 towards a rectangular imaginary end surface 11 which is positioned a certain distance L from the driver 4 in the forward running direction D. The rectangular imaginary end surface 11 is most clearly indicated in Figure 4 as a dashed rectangle at an end of the imaginary border walls 7, 8, 9, 10. The imaginary end surface 11 may thus be regarded to either define the imaginary border walls 7, 8, 9, 10, or to be defined by the imaginary border walls 7, 8, 9, 10.
As shown in Figure 2, Figure 3 and Figure 4, the imaginary border walls 7, 8, 9, 10 diverge from each other from the driver 4 towards the imaginary end surface 11 such that a first imaginary border wall 7 and a second imaginary border wall 8 face each other and the road 12 with a first elevation inclination α1 and a second elevation inclination α2, and a third imaginary border wall 9 and a fourth imaginary border wall 10 face each other with a first azimuth inclination β1 and a second azimuth inclination β2 and run perpendicular to the road 12. Preferably, the first imaginary border wall 7 runs horizontally in a level that coincides with a standard driver's eyes, and the second imaginary border wall 8 is inclined such that its part of the imaginary end surface 11 approximately is in level with the road 12. This is not necessary, but should be regarded as an example of the volume's extension.
The inclinations α1, α2, β1, β2 are measured with respect to a reference line S, where the elevation inclinations α1, α2 constitute elevation zone angles α1, α2 at each side of the reference line S, the elevation zone angles for example being of the magnitude 15°-20°. The azimuth inclinations β1, β2 constitute azimuth zone angles β1, β2 at each side of the reference line S, the azimuth zone angles for example being of the magnitude 10°-20°. It is also conceivable that the zone angles in a plane are of unequal values.
The reference line S is in this example defined as a line that runs through the volume 6 such that the elevation zone angles α1, α2 are mutually equal and such that the azimuth zone angles β1, β2 are mutually equal.
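The zone-angle geometry above can be illustrated with a short sketch. It is purely illustrative of how the volume 6 is bounded relative to the reference line S; the invention itself deliberately avoids computing gaze angles. The function name is hypothetical, and the default values are taken from the example magnitudes in the description, using the symmetric case where the zone angles on each side of S are mutually equal.

```python
def inside_volume(elev_deg, azim_deg, elev_zone=15.0, azim_zone=10.0):
    """Illustrative check only: is a direction, given as elevation and
    azimuth angles measured from the reference line S, inside the
    volume 6? Assumes alpha1 = alpha2 and beta1 = beta2."""
    return abs(elev_deg) <= elev_zone and abs(azim_deg) <= azim_zone
```

With unequal zone angles in a plane, as the description also allows, the two `abs()` comparisons would simply be replaced by separate upper and lower bounds.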
As shown in Figure 3, and also as indicated in Figure 4, keeping the azimuth zone angles β1, β2 equal, the reference line S and thus the volume 6 is slightly inclined to the right in an azimuth plane with reference to the forward running direction D. This is due to the fact that in this example, the driver 4 is sitting on the left-hand side of the vehicle and is driving on a right-hand sided lane 27 of the road 12.
Keeping the eyes 3 on the lane 27 in question means that the driver 4 has to look slightly to the right with reference to the forward running direction D.
While it is determined that the driver 4 is looking inside the volume 6, i.e. looking at some part of the imaginary end surface 11, the driver 4 is considered to be attentive to the road ahead. On the other hand, when it is determined that the driver 4 is looking outside the volume 6, i.e. not looking at some part of the imaginary end surface 11, the driver 4 is considered to be inattentive to the road ahead.
According to the present invention, the control unit 5 is arranged to determine whether the driver 4 is looking inside the volume 6 by analyzing detected images of the eyes 3, these images being indicative of certain eye configurations. The analysis is performed by comparing the detected eye configurations with previously stored models of eye configuration samples 13, 14, 15, 16. The stored models of eye configuration samples 13, 14, 15, 16 are indicative of eyes that look inside a predetermined field of view, in this example the volume 6 defined above. How these models of eye configuration samples are created will be discussed later in the description. The comparison is carried out by means of a suitable video processing algorithm, such as the well-known so-called Viola-Jones method using Haar features. Such algorithms are well-known in the field of image processing, and details of these will not be discussed further.
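As a rough illustration of the Haar features mentioned above, the sketch below computes a two-rectangle Haar-like feature from an integral image, which is the core operation of the Viola-Jones method. This is a minimal educational sketch, not the boosted classifier cascade itself; the function names are assumptions, and a production system would rely on a tested library implementation.

```python
import numpy as np

def integral_image(img):
    # Padded summed-area table: ii[y, x] holds the sum of img[:y, :x],
    # so any rectangle sum can be read off in constant time.
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, x, y, w, h):
    # Sum of the pixel rectangle img[y:y+h, x:x+w]
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect(ii, x, y, w, h):
    # Two-rectangle Haar-like feature: right half minus left half.
    # Responds to vertical intensity edges such as an iris/sclera boundary.
    left = rect_sum(ii, x, y, w // 2, h)
    right = rect_sum(ii, x + w // 2, y, w - w // 2, h)
    return right - left
```

For example, on a 4x4 patch whose right half is bright (value 1) and left half dark (value 0), the feature value equals the number of bright pixels, 8.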
If the analysis results in that the detected eye configurations are determined to be looking outside the predetermined field of view 6 to a predetermined extent, the control unit is arranged to produce an output signal that is indicative of vehicle driver inattentiveness. The predetermined extent may for example be a certain time and also exceeding a predetermined buffer, as will be discussed in a later part of the description.
The predetermined extent may also relate to whether one or two eyes in the detected eye configurations are determined to be looking inside or outside the predetermined field of view 6. A detected eye configuration may thus be determined to be looking inside the predetermined field of view 6 if one eye in the detected eye configuration is looking in the predetermined field of view 6. This means that the stored models of eye configuration samples 13, 14, 15, 16 are indicative of eyes that look inside the predetermined field of view 6 if such a model sample only is indicative of one eye that is looking in the predetermined field of view 6.
Such an output signal may result in a number of alternative actions. According to one example, an alert system may be activated, such that an acoustic and/or optical signal is triggered. Furthermore, triggering of vibrating means in the chair or a steering wheel 17 is also conceivable, as well as activation of a motorized seatbelt retractor.
Preferably, a buffer routine is used to delay the determination that the driver is inattentive. For example, an eyes-off-the-road warning should not be triggered until around 3-5 seconds of eyes-off-the-road time has been buffered.
Before the buffer becomes active, there is an initialization phase where the control unit 5 is arranged to determine the passing of a predetermined time of continuous detected eye configurations that are looking inside the predetermined field of view; an example of such a predetermined time is 1-2 seconds, constituting an initialization buffer threshold. This is to verify that there is a driver 4 present and that the system is working properly, for example when a driver 4 takes place in the vehicle 1, or after a false alarm. If the time of continuous detected eye configurations that are looking inside the predetermined field of view is interrupted before the predetermined time is reached, the initialization phase is re-started. An initialization buffer may be used for keeping track of detected eye configurations in the initialization phase.
After a successful initialization phase, the normal buffer is active, and the eye configurations of the driver 4 are now detected for evaluating a possible decreased attentiveness of the driver 4 as disclosed above, in an online phase.
The control unit 5 is preferably arranged to force a buffer re-initialization if the time increases past a predetermined time interval constituting a buffer threshold, for example 5-10 seconds, without the control unit 5 having determined that the vehicle driver is inattentive, thus without the control unit 5 issuing said signal. This is also with the intent to decrease the risk of false alarms. In this case, the control unit is arranged to re-start the initialization phase.
The above is indicated in Figure 7, where the initialization phase 28 and the online phase 29 mentioned above are disclosed in a flowchart.
In the initialization phase 28, the following steps are performed:
30: Detect eye configurations.
31: Are the detected eye configurations within the predetermined field of view 6?
32: If "Yes", increase the initialization buffer.
33: If "No", decrease the initialization buffer.
34: Has the initialization buffer threshold been passed?
35: If "Yes", then go to the online phase.
36: If "No", then go back to the start 30 of the initialization phase 28.

In the online phase 29, the following steps are performed:
37: Detect eye configurations.
38: Are the detected eye configurations within the predetermined field of view 6?
39: If "Yes", decrease the buffer.
40: If "No", increase the buffer.
41: Has the buffer threshold been passed?
42: If "Yes", then go back to the start 30 of the initialization phase 28.
43: If "No", has the threshold for issuing an output signal indicative of vehicle driver inattentiveness been passed?
44: If "Yes", issue an output signal indicative of vehicle driver inattentiveness, then go back to the start 37 of the online phase 29.
45: If "No", then go back to the start 37 of the online phase 29.

To avoid false alarms, the control unit 5 is further preferably arranged to retain the output signal that is indicative of vehicle driver inattentiveness during, and a certain time after, certain conditions, such as during turns when the steering wheel angle exceeds a predetermined value, such as for example 10°-15°, when the turn indicator is activated, and when turns are detected by means of GPS (Global Positioning System), camera devices and/or inertia sensors.
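The two-phase buffer routine of Figure 7 can be sketched as a small state machine. The class and method names are assumptions, frame counts stand in for time, and the concrete thresholds are illustrative mappings of the 1-2 second initialization threshold, the 3-5 second warning threshold and the forced re-initialization threshold; the reading of the re-initialization threshold as a bound on the buffer value is one possible interpretation of the flowchart.

```python
class AttentionBuffer:
    """Sketch of the initialization phase (steps 30-36) and the online
    phase (steps 37-45). All threshold values are illustrative frame
    counts, not values prescribed by the description."""

    def __init__(self, init_threshold=30, reinit_threshold=150,
                 warn_threshold=90):
        self.init_threshold = init_threshold      # ~1-2 s eyes-on-road
        self.reinit_threshold = reinit_threshold  # forced re-initialization
        self.warn_threshold = warn_threshold      # ~3-5 s eyes-off-road
        self.reset()

    def reset(self):
        self.phase = "init"
        self.init_buffer = 0
        self.buffer = 0

    def update(self, eyes_on_road):
        """Feed one detection result; returns True when an output signal
        indicative of vehicle driver inattentiveness should be issued."""
        if self.phase == "init":
            if eyes_on_road:
                self.init_buffer += 1            # step 32
            else:
                self.init_buffer = max(0, self.init_buffer - 1)  # step 33
            if self.init_buffer >= self.init_threshold:          # step 34
                self.phase = "online"            # step 35
                self.buffer = 0
            return False
        # online phase: the buffer accumulates eyes-off-the-road frames
        if eyes_on_road:
            self.buffer = max(0, self.buffer - 1)  # step 39
        else:
            self.buffer += 1                       # step 40
        if self.buffer >= self.reinit_threshold:   # steps 41-42
            self.reset()
            return False
        return self.buffer >= self.warn_threshold  # steps 43-44
```

A faster decrease rate or an emptying rule after a predefined eyes-on-the-road time, as discussed below, would only change the `eyes_on_road` branch of the online phase.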
Alternatively, in a curve, the volume may be adapted to follow a driver's natural gaze when entering and running in the curve.
In the above, the output signal has been described to activate an alert system, but as stated previously, such an output signal may result in a number of alternative actions.
Alternatively, or additionally, peripheral security systems may be enhanced, triggered or activated, such as automatic braking systems. In this context, it should be noted that alert systems normally are best suited for high speed driving, on for example highways, while many other security systems such as automatic braking systems are adapted to work at low speed driving, for example in cities.
To trigger an early automatic braking, when applicable, an appropriate buffer time might be a few hundred milliseconds; enough to not activate on blinks, but to activate relatively quickly when the driver looks away from the road.
In many applications it is also preferred to decrease the buffer at a faster rate and to empty it after a predefined eyes-on-the-road time. All of the settings described above need to be specifically set depending on what kind of system the buffer is connected to.
The present invention is normally arranged either for high speed driving or low speed driving, but may also be arranged to work at both. An example of this will be discussed later.
When the present invention is arranged for high speed driving, the output signal may be retained at speeds falling below a predetermined value, such as for example 60 kph (kilometers per hour). In this way, false alarms at low speed driving, for example in cities, are avoided. This is due to the fact that in cities, the driver's eyes 3 are normally looking at the road close to the vehicle, and the head is usually moving in order to keep track of, for example, other vehicles, pedestrians, traffic lights and different signs. Such behavior would not be permitted at high speed driving.
The distance L to the imaginary end surface 11 is for example 10 meters, but this is only an example of a suitable length.
By positioning the imaginary end surface 11 at such a distance away from the driver 4, parallax errors and sensitivity for different heights of drivers are lowered. A length of about 10 meters is also best suited for the case when the present invention is arranged for high speed driving.
As an example, since the driver's eyes 3 normally are looking at the road close to the vehicle in cities, the distance L could be altered to be of a lesser magnitude when low speed driving is detected, for example at speeds falling below a predetermined value, such as the previously mentioned 60 kph.
In this way, the present invention may be arranged to run in both a high speed driving mode and a low speed driving mode by altering the distance L in a suitable way, with appropriate changes to the size of the imaginary end surface 11.
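The two-mode adaptation of the distance L can be sketched as a trivial speed-dependent selection. The 10 meter and 60 kph figures come from the description above; the 4 meter low-speed value and the function name are made-up placeholders.

```python
def end_surface_distance(speed_kph, high_speed_distance=10.0,
                         low_speed_distance=4.0, threshold_kph=60.0):
    """Illustrative sketch: pick the distance L to the imaginary end
    surface 11 from the vehicle speed. A real system would also rescale
    the imaginary end surface 11 accordingly."""
    if speed_kph >= threshold_kph:
        return high_speed_distance
    return low_speed_distance
```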
The control unit 5 is programmed with mathematical models of positive eye configuration samples and negative eye configuration samples. Examples of positive eye configuration samples 13, 14, 15, 16 are constituted by the images inside the dashed boxes in Figure 5a - Figure 5c. Examples of negative eye configuration samples can be found in Figure 6a - Figure 6c, where in this case the whole images are used for defining the negative eye configuration samples. The eye configuration samples are collected in any suitable way, and are converted to mathematical models, for example by means of the Viola-Jones method using said Haar features.
The process of acquiring mathematical models of positive eye configuration samples and negative eye configuration samples is sometimes referred to as "training". As an example of how the mathematical models of positive eye configuration samples and negative eye configuration samples are acquired, i.e. how the so-called "training" may be performed, the following method may be used.
First, the collected images are analyzed by means of an eye tracking arrangement which is adapted to detect gaze angles.
Eye tracking arrangements of different types are previously well-known. Then, positive eye configuration samples are selected, either automatically or manually. Finally, a suitable method such as the previously mentioned Viola-Jones method is used for finding suitable features.
The previously discussed elevation zone angles α1, α2 and azimuth zone angles β1, β2 are used during the "training", and there is preferably a play of 3-5 degrees. In this way, a grey zone which is ignored in in-car use is created.
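The grey zone created by the play can be expressed as a three-way labeling rule for training samples. The sketch below is hypothetical: it assumes symmetric zone angles, a gaze-annotated training set as produced by the eye tracking arrangement mentioned above, and a 4 degree play picked from the stated 3-5 degree range.

```python
def label_training_sample(elev_deg, azim_deg,
                          elev_zone=15.0, azim_zone=10.0, play=4.0):
    """Label a gaze-annotated training sample: inside the zone angles
    it is a positive sample, beyond the zone angles plus the play it is
    a negative sample, and the grey zone in between is ignored. Angles
    are measured relative to the reference line S."""
    e, a = abs(elev_deg), abs(azim_deg)
    if e <= elev_zone and a <= azim_zone:
        return "positive"
    if e >= elev_zone + play or a >= azim_zone + play:
        return "negative"
    return "ignore"
```

Ignoring the grey-zone samples keeps borderline gaze directions out of both the positive and the negative model sets, which is what makes the trained classifier tolerant of small angular errors in in-car use.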
For a robust and generalized functionality, it is very important to get a diverse selection of eye configuration samples from different conditions and different types of facial features.
By means of the present invention, the problems inherent with the prior art are bypassed by classifying the driver's state of attentiveness by detecting the eyes 3 only, without calculating any gaze or head angles. By using image object detection algorithms, a classifier in the control unit 5 is trained to detect only eyes looking in a predetermined direction, such as within the volume 6 discussed above.
With reference to Figure 8, the present invention relates to a general method for detecting the level of alertness of a vehicle driver, the method comprising the steps:
22: Detecting eye configurations of the vehicle driver 4.
23: Analyzing the detected eye configurations by comparing the detected eye configurations with previously stored models of eye configuration samples 13, 14, 15, 16. The stored models of eye configuration samples 13, 14, 15, 16 are indicative of eyes that look inside and/or outside a predetermined field of view 6.
24: Determining whether the detected eye configurations are looking inside the predetermined field of view 6 or outside the predetermined field of view 6 using said analysis.
25: Indicating when the vehicle driver 4 has been determined to be looking outside the predetermined field of view 6 to a predetermined extent.
The present invention is not limited to the examples above, but may vary freely within the scope of the appended claims.
For example, there may be one or more NIR cameras 2 and one or more NIR flashes F, where the control unit 5 is arranged to send trigger signals to the camera 2 and the flashes F. The camera 2 should be placed as close as possible to the line of sight of the driver 4 when the driver 4 is looking forward. However, the placement is more important in the horizontal direction than in the vertical direction. Other types of camera systems, with or without flashes, are conceivable. One advantage with an infrared camera is that it does not matter if the driver's eyes are obscured, for example by sunglasses.
Examples of camera positions are at the steering wheel 17 rim or spokes, at the steering column 18, at the dashboard 19 or in the inner roof lining 20 as indicated in Figure 1. Other examples are airbag covers, a sun visor, an inner rear-view mirror assembly 21, and the vehicle A-pillars.
If the camera 2 is positioned straight in front of the driver 4, it is less complicated to determine if the driver 4 is looking inside the predetermined field of view due to more available data. However, for practical reasons this may not be possible due to space limitations, design reasons and view obstructive reasons.
The control unit 5 might be comprised by one or several units, and may also be comprised in another unit such as a vehicle restraint control unit.
The predetermined field of view and the imaginary end surface 11 may have any suitable form; for example, the imaginary end surface may be oval or polygonal. The volume may thus be defined by more walls than the four walls discussed in the example above. It is also conceivable that there is only one wall, which is suited for a round or oval imaginary end surface.
In the above, it is determined whether the driver 4 looks in the predetermined field of view by using stored models of eye configuration samples that are indicative of eyes that look inside a predetermined field of view. It is possible that stored models of eye configuration samples that are indicative of eyes that look outside a predetermined field of view, or both, are used instead.
The comparison is carried out by means of a suitable video processing algorithm; such algorithms are well-known and details of these will not be discussed further. A typical video processing algorithm may use stored models of eye configuration samples that are indicative of eyes that look inside a predetermined field of view and stored models of head and eye configuration samples that are indicative of eyes that look outside a predetermined field of view as indicated above, where it is stated that the whole images shown in Figure 6a - Figure 6c are used for defining negative eye configuration samples. This is of course only an example of which stored models are needed and how they are used. Generally, stored models of eye configuration samples that are indicative of eyes that look inside and/or outside a predetermined field of view are used.
Alternatively, it is possible that when eye configurations of the vehicle driver 4 have been detected, the analysis of the detected eye configurations comprises identifying the pupil and/or iris with cornea of the detected eye. The next step then comprises comparing the pupil and/or iris with cornea of the detected eye with previously stored models of pupils and/or irises with cornea.
By comparing features such as the position of the pupil, the shape of the iris and the positions of possible reflexes in the cornea, it may be established whether detected eye configurations constitute positive eye configurations or negative eye configurations, i.e. whether the detected eyes look inside and/or outside the predetermined field of view 6. For example, the iris of a positive eye configuration may be more circular than that of a negative eye configuration, which may be more elliptical.
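The iris-shape cue above can be sketched as a simple circularity ratio: a frontal (positive) eye configuration shows a near-circular iris outline, while a gaze far off the camera axis foreshortens it into an ellipse. The 0.9 threshold, the function names and the assumption that an ellipse has already been fitted to the iris outline are all illustrative, not values from the description.

```python
def iris_circularity(major_axis, minor_axis):
    # Ratio in (0, 1]; 1.0 means a perfectly circular iris outline
    return minor_axis / major_axis

def looks_on_axis(major_axis, minor_axis, threshold=0.9):
    """Hypothetical shape cue: classify an eye as a positive
    configuration when the fitted iris ellipse is nearly circular."""
    return iris_circularity(major_axis, minor_axis) >= threshold
```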
Furthermore, the possible reflexes in the cornea may be related to a center position of the eye.
Generally, the control unit 5 is arranged to indicate that the vehicle driver 4 has been determined to be looking outside the predetermined field of view 6 to a predetermined extent. Such an indication may comprise producing an output signal which is indicative of vehicle driver inattentiveness.
Claims:
Claims (14)
[1] 1. A method for detecting decreased attentiveness of a vehicle driver (4), the method comprising the step: (22) detecting eye configurations of the vehicle driver (4); characterized in that the method further comprises the steps: (23) analyzing the detected eye configurations by comparing the detected eye configurations with previously stored models of eye configuration samples (13, 14, 15, 16), the stored models of eye configuration samples (13, 14, 15, 16) being indicative of eyes that look inside and/or outside a predetermined field of view (6); (24) determining whether the detected eye configurations are looking inside the predetermined field of view (6) or outside the predetermined field of view (6) using said analysis; and (25) indicating when the vehicle driver (4) has been determined to be looking outside the predetermined field of view (6) to a predetermined extent.
[2] 2. A method according to claim 1, characterized in that the step of indicating when the vehicle driver (4) has been determined to be looking outside the predetermined field of view (6) to a predetermined extent comprises producing an output signal which is indicative of vehicle driver inattentiveness.
[3] 3. A method according to claim 2, characterized in that the output signal is used for triggering an alarm and/or one or more vehicle safety systems.
[4] 4. A method according to any one of the previous claims, characterized in that the predetermined field of view (6) is in the form of a volume that extends in a vehicle forward running direction (D).
[5] 5. A method according to claim 4, characterized in that the volume extends to an imaginary end surface (11), positioned at a certain distance (L) from the driver (4), where the detected eye configurations are determined to be looking inside the predetermined field of view (6) if they are determined to be looking at the imaginary end surface (11).
[6] 6. A method according to any one of the previous claims, characterized in that the stored models of eye configuration samples (13, 14, 15, 16) are modeled by using Haar features.
[7] 7. A driver attentiveness detection device comprising at least one digital camera device (2) and a control unit (5), said camera device (2) being arranged to detect eye configurations of a vehicle driver (4), characterized in that the control unit (5) is arranged to compare the detected eye configurations with previously stored models of eye configuration samples (13, 14, 15, 16), where the stored models of eye configuration samples (13, 14, 15, 16) are indicative of eyes that look inside and/or outside a predetermined field of view (6), the control unit (5) further being arranged to determine whether the detected eye configurations are looking inside the predetermined field of view (6) or outside the predetermined field of view (6) from said comparison, where the control unit (5) furthermore is arranged to indicate when the vehicle driver (4) has been determined to be looking outside the predetermined field of view (6) to a predetermined extent.
[8] 8. A device according to claim 7, characterized in that each indication by the control unit (5) that the vehicle driver (4) has been determined to be looking outside the predetermined field of view (6) to a predetermined extent comprises the production of an output signal which is indicative of vehicle driver inattentiveness.
[9] 9. A device according to claim 8, characterized in that the output signal is arranged to trigger an alarm and/or one or more vehicle safety systems.
[10] 10. A device according to any one of the previous claims 7-9, characterized in that the predetermined field of view (6) is in the form of a volume that extends in a vehicle forward running direction (D).
[11] 11. A device according to claim 10, characterized in that the volume extends to an imaginary end surface (11), positioned at a certain distance (L) from the driver (4), where the control unit (5) is arranged to determine that detected eye configurations are looking inside the predetermined field of view (6) if the control unit (5) determines that they are looking at the imaginary end surface (11).
[12] 12. A device according to any one of the previous claims 7-11, characterized in that said camera device (2) is of the type Near Infrared, NIR.
[13] 13. A device according to claim 10, characterized in that the device further comprises at least one NIR flash (F).
[14] 14. A device according to any one of the previous claims, characterized in that said camera device (2) is positioned at the steering wheel (17) rim or spokes, at the steering column (18), at the dashboard (19), in the inner roof lining (20), at airbag covers, at a sun visor, at an inner rear-view mirror assembly (21), or in vehicle A-pillars.
Similar technologies:
Publication number | Publication date | Patent title
US9834221B2|2017-12-05|Driver attentiveness detection method and device
CN106663377B|2019-04-09|The driving of driver is unable to condition checkout gear
US9041789B2|2015-05-26|System and method for determining driver alertness
JP5207249B2|2013-06-12|Driver condition monitoring system
US9542847B2|2017-01-10|Lane departure warning/assistance method and system having a threshold adjusted based on driver impairment determination using pupil size and driving patterns
US8063786B2|2011-11-22|Method of detecting drowsiness of a vehicle operator
EP1721782B1|2013-03-20|Driving support equipment for vehicles
US10800424B2|2020-10-13|Driver condition detection system
US20120206252A1|2012-08-16|Lane departure warning system
JPWO2008029802A1|2010-01-21|Driving information providing device
CN106471556A|2017-03-01|The driving of driver is unable to condition checkout gear
US20180201276A1|2018-07-19|Driver condition detection system
JP6454368B2|2019-01-16|Vehicle display system and method for controlling vehicle display system
KR20110119671A|2011-11-02|Safety system for a motor vehicle
US10023204B1|2018-07-17|Driving assisting method and driving assisting device using the same
KR101950476B1|2019-04-29|Driver state sensing system, driver state sensing method, and vehicle including thereof
JP4647387B2|2011-03-09|Vehicle driving support device
JP6103138B2|2017-03-29|Driver support system
KR101500016B1|2015-03-09|Lane Departure Warning System
JP6915503B2|2021-08-04|Driver monitor system
SE1530037A1|2015-10-04|Driver attentiveness detection method and device
KR101546893B1|2015-08-24|System for checking doze at the vehicle
JP2022038604A|2022-03-10|Vehicle control device
KR102038371B1|2019-11-26|Driver eye detecting device and method thereof
JP6880615B2|2021-06-02|Driver status judgment device and driving support system
Patent family:
Publication number | Publication date
SE539123C2|2017-04-11|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
EP3828755A1|2019-11-29|2021-06-02|Veoneer Sweden AB|Improved estimation of driver attention|
Legal status:
Priority:
Application number | Filing date | Patent title
SE1450396|2014-04-03|
SE1530037A | SE539123C2 | 2014-04-03 | 2015-03-25 | Method and device for detecting decreased attentiveness of vehicle driver